Realistic web traffic generators are critical to evaluating the capacity, scalability, and availability of web systems. Although web traffic generation is a classic research problem, no existing generator accounts for the characteristics of web robots or crawlers, which are now the dominant source of traffic to a web server. Administrators are thus unable to test, stress, and evaluate how their systems perform in the face of ever-increasing levels of web robot traffic. To resolve this problem, this paper introduces a novel approach to generating synthetic web robot traffic with high fidelity. The approach generates traffic that accounts for both the temporal and behavioral qualities of robot traffic, using statistical and Bayesian models fitted to the properties of robot traffic observed in web logs from North America and Europe. We evaluate our traffic generator by comparing the characteristics of the generated traffic to those of the original data, contrasting session arrival rates, inter-arrival times, and session lengths between generated and real traffic. Finally, we show that under the common LRU and LFU eviction policies, our generated traffic affects cache performance similarly to actual traffic.
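The cache-performance comparison described above can be sketched by replaying a request stream (real or generated) through simple LRU and LFU caches and comparing hit ratios. The following is a minimal illustrative sketch, not the authors' implementation; the function names, the dictionary-based cache structures, and the sample request stream are assumptions made for the example.

```python
from collections import OrderedDict, Counter

def lru_hit_ratio(requests, capacity):
    """Replay a request stream through an LRU cache; return the hit ratio."""
    cache = OrderedDict()
    hits = 0
    for key in requests:
        if key in cache:
            hits += 1
            cache.move_to_end(key)         # mark as most recently used
        else:
            if len(cache) >= capacity:
                cache.popitem(last=False)  # evict the least recently used
            cache[key] = True
    return hits / len(requests)

def lfu_hit_ratio(requests, capacity):
    """Replay a request stream through an LFU cache; return the hit ratio."""
    cache = set()
    freq = Counter()
    hits = 0
    for key in requests:
        freq[key] += 1
        if key in cache:
            hits += 1
        else:
            if len(cache) >= capacity:
                # evict the cached item with the lowest access frequency
                victim = min(cache, key=lambda k: freq[k])
                cache.discard(victim)
            cache.add(key)
    return hits / len(requests)

# Illustrative comparison on a toy request stream of resource identifiers;
# in practice the streams would come from real and generated robot sessions.
stream = ["a", "b", "a", "c", "b"]
print(lru_hit_ratio(stream, capacity=2))  # → 0.2
print(lfu_hit_ratio(stream, capacity=2))  # → 0.2
```

A generator with high fidelity should yield hit-ratio curves (across cache capacities) for synthetic traffic that closely track those produced by the original log data under both policies.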